Multi-Task Learning for Classification with Dirichlet Process Priors

Authors

  • Ya Xue
  • Xuejun Liao
  • Lawrence Carin
  • Balaji Krishnapuram
Abstract

Multi-task learning (MTL) is considered for logistic-regression classifiers, based on a Dirichlet process (DP) formulation. A symmetric MTL (SMTL) formulation is considered in which classifiers for multiple tasks are learned jointly, with a variational Bayesian (VB) solution. We also consider an asymmetric MTL (AMTL) formulation in which the posterior density function of the SMTL model parameters, learned from previous tasks, is used as the prior for a new task; this approach has the significant advantage of not requiring storage and use of all data from previous tasks. The AMTL formulation is solved with a simple Markov chain Monte Carlo (MCMC) construction. Comparisons are also made to simpler approaches, such as single-task learning, pooling of data across tasks, and simplified approximations to the DP. A comprehensive analysis of algorithm performance is provided through consideration of two data sets that are matched to the MTL problem.
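The task-sharing behavior the abstract describes comes from the clustering a DP prior induces over task-specific classifier parameters: tasks assigned to the same mixture component share a logistic-regression classifier. A minimal sketch of that induced partition, via the Chinese restaurant process representation of the DP (this illustrates only the prior over task groupings, not the paper's VB or MCMC inference; the function name and concentration value are illustrative):

```python
import numpy as np

def crp_partition(n_tasks, alpha, rng):
    """Draw a partition of tasks from the Chinese restaurant process,
    the clustering over task parameters induced by a DP prior.
    Tasks in the same cluster would share one classifier."""
    assignments = []
    counts = []  # number of tasks currently in each cluster
    for t in range(n_tasks):
        # task t joins existing cluster k with prob. counts[k] / (t + alpha),
        # or starts a new cluster with prob. alpha / (t + alpha)
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # new cluster: a new classifier is instantiated
        else:
            counts[k] += 1     # existing cluster: classifier is shared
        assignments.append(k)
    return assignments

rng = np.random.default_rng(0)
z = crp_partition(10, alpha=1.0, rng=rng)
```

Smaller `alpha` favors fewer clusters (more sharing across tasks, approaching pooled learning); larger `alpha` favors many singleton clusters (approaching single-task learning), which is the spectrum the paper's comparisons probe.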


Similar Articles

Classification with Incomplete Data Using Dirichlet Process Priors

A non-parametric hierarchical Bayesian framework is developed for designing a classifier, based on a mixture of simple (linear) classifiers. Each simple classifier is termed a local "expert", and the number of experts and their construction are manifested via a Dirichlet process formulation. The simple form of the "experts" allows analytical handling of incomplete data. The model is extended to...


Focused Multi-task Learning Using Gaussian Processes

Given a learning task for a data set, learning it together with related tasks (data sets) can improve performance. Gaussian process models have been applied to such multi-task learning scenarios, based on joint priors for functions underlying the tasks. In previous Gaussian process approaches, all tasks have been assumed to be of equal importance, whereas in transfer learning the goal is asymme...


Learning Task Relatedness via Dirichlet Process Priors for Linear Regression Models

In this paper we present a hierarchical model of linear regression functions in the context of multi–task learning. The parameters of the linear model are coupled by a Dirichlet Process (DP) prior, which implies a clustering of related functions for different tasks. To make approximate Bayesian inference under this model we apply the Bayesian Hierarchical Clustering (BHC) algorithm. The experim...


Bayesian Multi-Task Compressive Sensing with Dirichlet Process Priors

Compressive sensing (CS) is an emerging field that, under appropriate conditions, can significantly reduce the number of measurements required for a given signal. Specifically, if the m-dimensional signal u is sparse in an orthonormal basis represented by the m × m matrix Ψ, then one may infer u based on n ≪ m projection measurements. If u = Ψθ, where θ are the sparse coefficients in basis Ψ, the...


A 'Gibbs-Newton' Technique for Enhanced Inference of Multivariate Polya Parameters and Topic Models

Hyper-parameters play a major role in the learning and inference process of latent Dirichlet allocation (LDA). In order to begin the LDA latent variables learning process, these hyperparameters values need to be pre-determined. We propose an extension for LDA that we call ‘Latent Dirichlet allocation Gibbs Newton’ (LDA-GN), which places non-informative priors over these hyper-parameters and use...



Journal:
  • Journal of Machine Learning Research

Volume 8  Issue 

Pages  -

Publication date 2007